
ndarray: porting Python's autograd #274

Merged
iblislin merged 29 commits into dmlc:master from issue-274 on Dec 18, 2017

Conversation

@iblislin iblislin changed the title from "porting Python's autograd" to "WIP: porting Python's autograd" on Sep 18, 2017
@iblislin
Member Author

iblislin commented Sep 18, 2017

TODO:

  • port feature: user-defined differentiable function
    • In Python, it provides the class Function. I'm also curious about how this could be implemented in Julia. If we define an abstract type like CustomFunc, how can we make it callable?
  • test cases (a minimal usage sketch follows this list):
    • attach_grad!
    • getgrad
    • mark_variables!
    • symbol
    • record
    • pause
    • train_mode
    • predict_mode
    • backward!
  • leverage grad_req_map from #283 (update OpReqType enum)
  • update NEWS.md
  • docs
  • figure out this: #274 (comment)
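
Not part of the PR itself; a minimal usage sketch of the workflow these test cases cover, assuming the bang-suffixed names from the list above (`attach_grad!`, `record`, `backward!`, `getgrad`); the exact names and signatures may differ from what was finally merged:

```julia
using MXNet

x = mx.NDArray([1.0 2.0; 3.0 4.0])
mx.attach_grad!(x)       # allocate a gradient buffer for x

y = mx.record() do       # record operations for later differentiation
    x .* x
end

mx.backward!(y)          # back-propagate through the recorded operations
copy(mx.getgrad(x))      # dy/dx = 2x  =>  [2.0 4.0; 6.0 8.0]
```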

@vchuravy
Collaborator

We don't have custom operators yet, so I don't think we need user-defined differentiable functions (it is actually a very hard problem to solve in Julia...)

@iblislin
Member Author

In Python, this shows that we only need one C API: MXCustomFunctionRecord.
So... IMO we can still have user-defined diff funcs.
Also, IIUC, make_symbol translates the NDArray style (dynamic) into the Symbolic style (static). Maybe autograd can reveal another way to create custom layers by combining make_symbol and user funcs?

About the implementation of user funcs, maybe metaprogramming can help; just my imagination:

```julia
@mx.registerfunc MyFunc  # create a callable function type for it
forward(f::MyFunc, ...) = ...
backward(f::MyFunc, ...) = ...
```

but I'm not sure where the pitfalls of implementing it are.
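
A minimal sketch of the callable-type idea in plain Julia, not tied to libmxnet; the names `CustomFunc`, `MyFunc`, `forward`, and `backward` are illustrative only:

```julia
abstract type CustomFunc end

struct MyFunc <: CustomFunc end

# user-supplied forward and backward passes
forward(::MyFunc, x)      = x .* x
backward(::MyFunc, x, dy) = 2 .* x .* dy

# call overloading makes instances of MyFunc callable; a macro such as the
# imagined @mx.registerfunc could generate this definition automatically
(f::MyFunc)(x) = forward(f, x)

f = MyFunc()
f([1.0, 2.0, 3.0])   # => [1.0, 4.0, 9.0]
```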

@iblislin
Member Author

iblislin commented Sep 18, 2017

found a problematic case:

```julia
julia> x = mx.NDArray([1 2; 3 4])
       mx.attach_grad(x)

       y = mx.record() do
         1x .* x
       end

       mx.backward(y)

julia> copy(mx.grad(x))
2×2 Array{Int64,2}:
 1   4
 9  16
```

Its gradient should be the same as the one in the commit message.
Something wrong in broadcast?
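
For reference, the analytic gradient of `y = x .* x` is `2x`, so the expected result is:

```julia
2 .* [1 2; 3 4]   # => [2 4; 6 8], not the [1 4; 9 16] shown above
```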

@vchuravy
Collaborator

I would leave custom functions completely out of the picture for now. The problems and pitfalls are plenty and I am currently not convinced there is a good way to implement them (outside of #173, which is currently stalled).

In order to make custom functions work you need to guarantee that they are never called asynchronously.

@iblislin
Member Author

In order to make custom functions work you need to guarantee that they are never called asynchronously.

Which doc of the mxnet C API can I consult? I still cannot understand what the issue with asynchrony is (IO?).

@vchuravy
Collaborator

See the discussion and linked abandoned PRs in #166

@pluskid
Member

pluskid commented Sep 19, 2017

Thanks! About custom ops, I think the brief summary is that libmxnet has an async engine for executing operators, so the Julia callback will be called on a different thread than the main thread. Due to the currently incomplete support for multi-threaded GC in Julia, a callback invoked from the C++ side on a different thread is not guaranteed to function properly.

@codecov-io

codecov-io commented Sep 25, 2017

Codecov Report

Merging #274 into master will increase coverage by 1.2%.
The diff coverage is 100%.


@@            Coverage Diff            @@
##           master     #274     +/-   ##
=========================================
+ Coverage   69.74%   70.94%   +1.2%     
=========================================
  Files          25       26      +1     
  Lines        1963     2034     +71     
=========================================
+ Hits         1369     1443     +74     
+ Misses        594      591      -3
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/ndarray.jl | 82.36% <ø> (-0.29%) | ⬇️ |
| src/base.jl | 34.09% <ø> (+1.48%) | ⬆️ |
| src/autograd.jl | 100% <100%> (ø) | |
| src/util.jl | 35.23% <0%> (-1.22%) | ⬇️ |
| src/executor.jl | 84.33% <0%> (ø) | ⬆️ |
| src/broadcast.jl | 100% <0%> (+66.66%) | ⬆️ |


Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update db09528...0b11e4a.

@iblislin
Member Author

iblislin commented Oct 7, 2017

For the record, here is an API change:

@iblislin iblislin force-pushed the issue-274 branch 3 times, most recently from 3146501 to f3e7bd6 Compare October 8, 2017 14:24
@iblislin iblislin force-pushed the issue-274 branch 3 times, most recently from 46b4548 to e96f0ce Compare November 9, 2017 07:24
@iblislin
Member Author

iblislin commented Dec 7, 2017

https://github.com/apache/incubator-mxnet/blob/7484437d21c5e1456c24bff31e6642cf0a47db4d/src/imperative/imperative.cc#L182-L197

I ran into this error with a recent libmxnet master;
it seems we should get rid of any use of `op(...; out = ...)`.
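
A hypothetical illustration of the pattern in question (`some_op` is a stand-in for any generated operator, not a real function in MXNet.jl):

```julia
x   = mx.NDArray([1.0 2.0; 3.0 4.0])
out = mx.NDArray(zeros(2, 2))

# Problematic: writing into a pre-allocated NDArray while recording,
# i.e. the `op(...; out = ...)` form that now raises the error above.
mx.record() do
    some_op(x; out = out)
end

# Preferred: let the operator allocate its own result.
y = mx.record() do
    some_op(x)
end
```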

iblislin added a commit that referenced this pull request Dec 7, 2017
iblislin added a commit that referenced this pull request Dec 7, 2017
iblislin added a commit that referenced this pull request Dec 7, 2017
address #274 (comment)

Although this patch cannot pass `@inferred`, `code_warntype` gives me
this:
```julia
end::MXNet.mx.NDArray{_,_} where _ where _
```

And it seems it doesn't hurt performance.
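
For context, a generic (non-MXNet.jl) illustration of what the `@inferred` check verifies:

```julia
using Test

f(x) = x > 0 ? 1 : 1.0   # return type is inferred as Union{Int, Float64}

@inferred f(2)           # throws, because the inferred return type is not
                         # concrete even though the call itself works fine
```
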
iblislin added a commit to iblislin/MXNet.jl that referenced this pull request Dec 7, 2017
iblislin added a commit to iblislin/MXNet.jl that referenced this pull request Dec 7, 2017
@iblislin
Member Author

I think this PR is ready for review.
I will send another PR for docs and docstrings.

@iblislin iblislin requested a review from pluskid December 11, 2017 06:38
@iblislin iblislin changed the title from "WIP: porting Python's autograd" to "ndarray: porting Python's autograd" on Dec 11, 2017
@iblislin
Member Author

I tried to implement user-defined functions.
I will send a PR after this PR is merged.
https://github.com/dmlc/MXNet.jl/tree/ib/nd-ad-customfunc

@iblislin iblislin added this to the 0.4.0 milestone Dec 12, 2017
@iblislin
Member Author

@pluskid do you need more time to review?

@iblislin
Member Author

The Travis failure is a macOS build timeout.

@iblislin iblislin merged commit cb443b7 into dmlc:master Dec 18, 2017
@iblislin iblislin deleted the issue-274 branch December 18, 2017 12:16